Matrix exponential

Given a matrix $K$, we define a new matrix $\exp(K)$ as

$$ \exp(K)=\mathrm{id}+K + \frac{K^{2}}{2 !}+\cdots $$

It can be shown that this is well defined, i.e., the series converges for every $K$. The proof involves defining a suitable norm on the matrix space and comparing the partial sums with the scalar exponential series.
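As a quick sanity check, here is a minimal sketch (in Python with NumPy/SciPy; not part of the original notes) that approximates $\exp(K)$ by truncating the defining series and compares the result with SciPy's built-in `expm`:

```python
# Illustrative sketch: approximate exp(K) by a partial sum of the
# defining series and compare with SciPy's matrix exponential.
import numpy as np
from scipy.linalg import expm

def exp_series(K, terms=30):
    """Partial sum id + K + K^2/2! + ... + K^(terms-1)/(terms-1)!."""
    result = np.eye(K.shape[0])
    power = np.eye(K.shape[0])
    for k in range(1, terms):
        power = power @ K / k          # power now holds K^k / k!
        result = result + power
    return result

K = np.array([[0.0, 1.0], [-1.0, 0.0]])   # arbitrary test matrix
print(np.allclose(exp_series(K), expm(K)))  # True: the series converges fast
```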

The matrix exponential has many useful properties. For example, if $A$ is the $n \times n$ matrix

$$ A=\left( \begin{array}{ccccc}{0} & {1} & {0} & {\dots} & {0} \\ {0} & {0} & {1} & {\dots} & {0} \\ {\vdots} & {\vdots} & {\vdots} & {\ddots} & {\vdots} \\ {0} & {0} & {0} & {\dots} & {1} \\ {0} & {0} & {0} & {\ldots} & {0}\end{array}\right) $$

then

$$ e^{A t}=\left( \begin{array}{ccccc}{1} & {t} & {t^{2} / 2 !} & {\dots} & {t^{n-1} /(n-1) !} \\ { } & {1} & {t} & {\ddots} & {\vdots} \\ { } & { } & {1} & {\ddots} & {t^{2} / 2 !} \\ { } & { } & { } & {\ddots} & {t} \\ { } & { } & { } & { } & {1}\end{array}\right) $$

Proof

First, check that $A$ is the matrix of the derivative operator on the vector space of polynomials of degree less than $n$, with respect to the basis $\{1,\frac{x}{1!}, \frac{x^2}{2!}, \dots, \frac{x^{n-1}}{(n-1)!}\}$. And the other matrix above is the matrix of the linear operator $H_t:p(x)\mapsto p(x+t)$ acting on the same vector space with the same basis.

Now, classical Taylor's theorem (which is exact here, since $p$ is a polynomial and the sum terminates) states that

$$ e^{At}(p(x))=(\mathrm{id}+tA + t^2\frac{A^{2}}{2 !}+\cdots )(p(x))=p(x)+tp'(x)+t^2\frac{p''(x)}{2!}+\cdots=p(x+t)=H_t(p(x)) $$

That is to say

$$ \exp(At)=H_t $$

$\blacksquare$
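The closed form above can also be checked numerically. The following sketch (my example, assuming NumPy/SciPy) builds the nilpotent matrix $A$ with ones on the superdiagonal and compares `expm(t*A)` with the upper-triangular matrix whose $(i,j)$ entry is $t^{j-i}/(j-i)!$:

```python
# Numerical check of the example: for the nilpotent shift matrix A,
# expm(t*A) equals the upper-triangular matrix with entries t^k/k!.
import numpy as np
from scipy.linalg import expm
from math import factorial

n, t = 5, 0.7
A = np.diag(np.ones(n - 1), k=1)            # ones on the superdiagonal

# Closed form: (e^{tA})_{ij} = t^{j-i}/(j-i)! for j >= i, and 0 below
H = np.array([[t**(j - i) / factorial(j - i) if j >= i else 0.0
               for j in range(n)] for i in range(n)])
print(np.allclose(expm(t * A), H))          # True
```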

Given any $K \in \mathcal{M}(n, \mathbb{R})$, let $\gamma(t)=\exp (t K)$ for $t \in \mathbb{R}$. Observe that $\gamma$ is differentiable, and $\gamma'(t)$ can be obtained by term-by-term differentiation of the series (this is justified by standard convergence theorems). In fact, $\gamma'(t)=K\gamma(t)$.

Moreover, we can check that $\gamma: \mathbb{R} \rightarrow GL(n,\mathbb{R})$ is a one-parameter subgroup with $\gamma^{\prime}(0)=K$.
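As an illustration (a sketch with an arbitrarily chosen $K$, not from the notes), one can verify the group law $\gamma(s+t)=\gamma(s)\gamma(t)$ and approximate $\gamma'(0)$ by a finite difference:

```python
# Sketch: gamma(t) = exp(tK) is a one-parameter subgroup with
# gamma'(0) = K (checked here by a finite-difference approximation).
import numpy as np
from scipy.linalg import expm

K = np.array([[0.0, -2.0], [2.0, 0.0]])     # arbitrary generator
gamma = lambda t: expm(t * K)

s, t, h = 0.3, 1.1, 1e-6
print(np.allclose(gamma(s + t), gamma(s) @ gamma(t)))          # group law
print(np.allclose((gamma(h) - gamma(0.0)) / h, K, atol=1e-4))  # gamma'(0) ~ K
```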

The surprise is that the converse is also true:

Proposition

If

$$ \gamma : \mathbb{R} \rightarrow G L(n, \mathbb{R}) $$

is a one-parameter subgroup, then

$$ \gamma(t)=\exp (t A) $$

where $A=\gamma'(0)$.

$\blacksquare$

Proof

It is based on the existence and uniqueness theorem for ODEs. Let $y(t)=\gamma(t) x$, where $x$ is a constant vector in $\mathbb{R}^n$. Using the definition of the derivative together with the group law $\gamma(t+h)=\gamma(h)\gamma(t)$, we check that $y$ is a solution of

$$ y^{\prime}=A y $$

with $A=\gamma'(0)$ and initial condition $y(0)=x$. But $y(t)=e^{At}x$ is another solution with the same initial condition, so by uniqueness $\gamma(t)x=e^{At}x$ for every $x$, and hence $\gamma(t)=e^{At}$.

$\blacksquare$
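The uniqueness argument can be seen in action numerically. The sketch below (with an arbitrary choice of $A$ and $x$) integrates $y'=Ay$ with a standard ODE solver and compares the result with $e^{At}x$ at the final time:

```python
# Sketch of the uniqueness argument: integrate y' = A y from y(0) = x
# and compare with the explicit solution e^{At} x.
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0], [-1.0, 0.0]])     # arbitrary test matrix
x = np.array([1.0, 2.0])                    # arbitrary initial vector
T = 2.0

sol = solve_ivp(lambda t, y: A @ y, (0.0, T), x, rtol=1e-10, atol=1e-12)
print(np.allclose(sol.y[:, -1], expm(A * T) @ x))   # True: same solution
```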

Observe that $\gamma(\mathbb{R})$ may be contained in a subgroup $G$ of $GL(n,\mathbb{R})$; in this case, $A \in \mathfrak{g}$. Conversely, if we take any $A \in \mathfrak{g}$, then $\exp (t A)\in G$ (proof?).

Thus, if we have a matrix $M_{\tau}$ depending on a parameter $\tau$ in such a way that $\{M_{\tau}\}$ constitutes a one-parameter group, then

$$ M_{\tau}=\mathrm{id}+K \tau+K^{2} \frac{\tau^{2}}{2 !}+\cdots =\exp(\tau K), \qquad \tau \in \mathbb{R} $$

for some matrix $K$ not depending on $\tau$.
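For instance, taking the one-parameter group of plane rotations as $M_\tau$ (my choice of example, not from the notes), the generator $K$ can be recovered numerically as the derivative at $\tau=0$:

```python
# Sketch: recover the generator K of the rotation group M_tau as the
# finite-difference derivative at tau = 0, then check M_tau = exp(tau K).
import numpy as np
from scipy.linalg import expm

def M(tau):
    c, s = np.cos(tau), np.sin(tau)
    return np.array([[c, -s], [s, c]])

h = 1e-6
K = (M(h) - M(0.0)) / h                     # approximates gamma'(0)
print(np.round(K, 4))                       # ~ [[0, -1], [1, 0]]
print(np.allclose(M(0.8), expm(0.8 * K), atol=1e-5))   # True
```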

Another important result, although it holds more generally than for matrix groups, is:

Proposition

Given a differentiable group homomorphism $F:G_1\longrightarrow G_2$, we have

$$ F(\exp(A))=\exp(dF(A)) $$

$\blacksquare$
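A concrete instance (this example is mine, not from the notes): for $F=\det : GL(n,\mathbb{R})\rightarrow \mathbb{R}^{*}$, the differential at the identity is the trace, so the proposition reads $\det(\exp(A))=\exp(\operatorname{tr} A)$. A quick numerical check:

```python
# Sketch of the proposition for F = det, whose differential at the
# identity is the trace: det(exp(A)) = exp(tr(A)).
import numpy as np
from scipy.linalg import expm

A = np.array([[0.5, 2.0, -1.0],
              [0.0, -0.3, 4.0],
              [1.5, 0.0, 0.1]])             # arbitrary test matrix
print(np.allclose(np.linalg.det(expm(A)), np.exp(np.trace(A))))  # True
```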

________________________________________

Author of the notes: Antonio J. Pan-Collantes

antonio.pan@uca.es

